rm(list = ls(all = TRUE))
library(sf)
library(tidyverse)
library(terra)
library(nhdplusTools)
library(mapview)
library(dataRetrieval)
library(lubridate)
library(prism)
library(ggspatial)
library(nngeo)  # Added from original code
library(stars)  # Added from original code
# Source all helper functions in src/. plyr::ldply() throws an error here,
# but it can safely be ignored -- every script is still sourced:
try(plyr::ldply(list.files(path = "src/",
                           pattern = "*.R",
                           full.names = TRUE),
                source))
## Error in list_to_dataframe(res, attr(.data, "split_labels"), .id, id_as_factor) :
##   Results must be all atomic, or all data frames
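If the `ldply()` error bothers you, a plain `lapply()` loop sources the same files without it, since `lapply()` never tries to combine the sourced results into a data frame. This is a sketch of an alternative, not part of the original workflow:

```r
# Source every helper script in src/ one at a time.
# lapply() returns a plain list, so there is no "Results must be all
# atomic" error to suppress; if src/ is missing, nothing is sourced.
src_files <- list.files(path = "src/", pattern = "\\.R$", full.names = TRUE)
invisible(lapply(src_files, source))
```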
# Rmarkdown options. Note: comment takes a string; comment = F would
# prefix every output line with the literal text "FALSE".
knitr::opts_chunk$set(echo = TRUE, warning = FALSE, comment = "##", message = FALSE)
# mapview options
mapviewOptions(basemaps.color.shuffle = FALSE, basemaps = 'OpenTopoMap')
For this code to run properly, your site data must be stored in the
site.comid.data/ folder. Currently, this workflow requires downloading
several data sets locally for much speedier run times, including the
PRISM climate and aridity rasters, NHD flow direction data, and the
CONUS-wide NHD catchments. All of these data sets are found in the
shared data folder.
COMIDs are used to link each site with every dataset in this workflow.
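If a site's COMID still needs verifying, nhdplusTools can look one up from a point location via the NLDI web service. A minimal sketch, assuming internet access; the coordinates below are placeholders, not a real study site:

```r
# Look up the COMID of the NHDPlus flowline nearest a lon/lat point.
# Build an sf point with a WGS84 CRS (placeholder coordinates):
pt <- sf::st_sfc(sf::st_point(c(-122.33, 47.61)), crs = 4326)

# Query the NLDI for the nearest flowline's COMID (requires internet):
nhdplusTools::discover_nhdplus_id(point = pt)
```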
sf_use_s2(FALSE)
site_type = "comid" # This step assumes you have already checked your COMIDs
sites <- read_csv("v2_RCSFA_Geospatial_Data_Package/v2_RCSFA_Geospatial_Site_Information.csv")
# Drop sites that are too small to have a COMID or that fall outside the US,
# then select and rename the columns to match what the code needs
sites = sites %>% filter(COMID != -9999) %>% dplyr::select(site = 'Site_ID', comid = 'COMID')
Pull all metadata associated with each site's COMID.
if(site_type == "comid"){
sites <- getNHDcomid(df = dplyr::select(sites, site, comid))
}
## [1] "Your data has duplicate or triplicate COMIDs"
## [1] "266 locations linked to the NHD."
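The duplicate-COMID message above comes from `getNHDcomid()`, a helper defined in src/. Before proceeding, it can be useful to see exactly which sites share a catchment. A minimal sketch on a toy table (the column names match the `sites` object; the values are made up):

```r
# Toy sites table: sites "S1" and "S2" fall on the same NHD catchment.
sites_toy <- data.frame(site  = c("S1", "S2", "S3"),
                        comid = c(101, 101, 202))

# Count how many sites map to each COMID, then keep the shared ones.
dup_tab    <- table(sites_toy$comid)
dup_comids <- as.numeric(names(dup_tab[dup_tab > 1]))

# Rows whose COMID is shared by more than one site:
sites_toy[sites_toy$comid %in% dup_comids, ]
#   site comid
# 1   S1   101
# 2   S2   101
```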
Make NHD-based watershed shapefiles for all CONUS sites. To make this
step MUCH faster, it is best to have a locally downloaded copy of the
national NHD catchment shapefile stored on your local system. I have
already included this shapefile in the data folder.
site_watersheds <- getWatersheds(df = sites, make_pretty = TRUE) %>%
inner_join(., select(sf::st_drop_geometry(sites), site, comid), by = "comid")
Map all your sites. Maps are automatically stored in the
data/maps/ folder. It is highly recommended to review each
map, particularly for known locations along BIG rivers and TINY
streams. Note: COMIDs should have been checked before running this
code.
suppressWarnings(map2(sites$site, sites$comid, getMaps))
Interactive map showing all sites and their delineated watersheds:
mapview(site_watersheds, col.regions = "#56B4E9", alpha.regions = 0.2, lwd = 3, layer.name = "Watershed") +
mapview(sites, cex = 5, col.regions = "black", layer.name = "Points") +
mapview(st_read('data/site_flowlines.gpkg', quiet = T), lwd = 3, color = "red", layer.name = "Flowline")